As part of the Big Data Challenge on Signalized Intersections 2019, a high-resolution, intersection-based signal and flow detection dataset was observed, processed, and presented in this study. Meso-/macroscopic analyses are performed to diagnose the performance of each intersection, and the open-source language R is used throughout, as R is one of the fastest and most powerful tools for big-data processing and analysis. Unlike traditional intersection performance measurements, this study interprets and delivers current practice as a bird's-eye view to help operators and stakeholders. Focusing on a specific day of the week in the given data, several interactive maps and charts are presented throughout this report. Finally, simple vehicle delay and signal time distributions for each intersection are obtained.
To improve traffic performance measures, the Utah Department of Transportation (UDOT) agreed to release a dataset covering 22 signalized intersections along two major corridors in Salt Lake City, Utah, United States, between January 2018 and December 2018. As Figure 1 below shows, each intersection is identified by a unique number along the two major corridors, 300 West and 700 East, and each has a different capacity and dynamic traffic pattern. The given high-resolution dataset, recorded at a 0.1-second time interval, packs a variety of information into a quite simple format, including vehicle and pedestrian detections and signal phase changes (rings and barriers). Although the recorded files are relatively small compared with those of traditional detector types, such as loop detectors or Bluetooth detectors, the data tell quite specific, high-resolution stories about the roads. The key to this recording innovation is the effective use of a series of lookup tables in a database fashion (Sturdevant et al., 2012).
Figure 1. 300 West and 700 East of Salt Lake City
There are a few pioneering studies that use high-resolution traffic signal data for performance measures. Day et al. (2011) proposed optimization functions using site-observed high-resolution detector data to adjust offsets on an arterial system of eight coordinated signals in Noblesville, Indiana. They used green time, arrivals per cycle, and estimated queue size per cycle to minimize delays and stops. Brennan et al. (2011) developed a series of visualization tools to provide educational insight into signal systems under different parameter settings and system behaviors, covering 'time-of-day schedule change time, observed cycle length, green time/split time, coordinated phase actuation, early return to green, arrivals over advance detection relative to green indication, progression quality characteristics related to offset, adjacent signal synchronization, coordinated phase operation in rest, plan time changes, preemption, impact of queuing, and longitudinal analysis of splits' (Brennan et al., 2011). More recently, as the Federal Highway Administration (FHWA) decided to promote the use of high-resolution data through automated traffic signal performance measures (ATSPMs), the need to understand and interpret these data has become crucial. So far, 11 state departments of transportation (DOTs) and 26 transportation agencies across the United States are pursuing this next generation of traffic signal operating system (FHWA, accessed May 21, 2019), and several case studies and measurements have been suggested by state DOTs and agencies (Kimble, 2017; Gault, 2018).
Every observation from each detector is coded with four categories: SignalID, Time, Event Code, and Parameter. SignalID indicates the geographical location of the detector and is the key for decoding detector details such as approach, detector, movement, and lane information. Time is the exact recorded time of each observation. Event Code tells whether the detection concerns a vehicle movement or a signal change, and it is closely tied to Parameter. Finally, Parameter gives the details of each Event Code, such as the phase number for a signal event or the detection details for a vehicle event. Figure 2 below describes the overall data architecture of the given high-resolution detector data.
Figure 2. A general view of the data structure
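The four-field record just described can be sketched as a simple data structure. Python is used here for illustration (the study itself used R), and the field names and the comma-separated row layout below are assumptions for the sketch, not the dataset's actual file format:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class HiResEvent:
    signal_id: int    # unique intersection identifier (key into the lookup tables)
    time: datetime    # exact recorded time, 0.1-second resolution
    event_code: int   # e.g. 82 = detector on, 81 = detector off
    parameter: int    # detail of the event code: phase number or detector channel

def parse_row(row: str) -> HiResEvent:
    """Parse one hypothetical comma-separated record into the four categories."""
    sid, ts, code, param = row.split(",")
    return HiResEvent(int(sid), datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f"),
                      int(code), int(param))

ev = parse_row("7122,2018-01-02 07:15:03.4,82,5")
```

Each record is thus self-describing only in combination with the lookup tables: the same Parameter value means a phase for a signal event code but a detector channel for a detection event code.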
Data pre-processing was performed by splitting signal and detector information for each record, allocating signal phase information or detector (DET) channel information according to the event code.
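The split can be sketched as a single partition by event code (Python for illustration). The two-code detector set below follows the detector-on/off codes #82/#81 used later in this report; treating every other code as a signal event is an assumption of this sketch:

```python
DETECTOR_CODES = {81, 82}  # 81 = detector off, 82 = detector on

def split_events(events):
    """Partition raw event dicts into (signal_events, detector_events)."""
    signal, detector = [], []
    for ev in events:
        # Detector on/off codes go to the detector stream; the rest are signal events.
        (detector if ev["event_code"] in DETECTOR_CODES else signal).append(ev)
    return signal, detector

# Hypothetical mini-batch: one phase event and one detector on/off pair.
sig, det = split_events([{"event_code": 1}, {"event_code": 82}, {"event_code": 81}])
```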
Table 1 summarizes the key signal timing information used in this study; the signal allocation distributions discussed below are derived from it. Table 1. Key signal information used in this study
This information contains quite specific vehicle movements by time, lane, and occupancy. However, some event data were missing or absent from the given lookup table. Therefore, this study focuses on snapshots of operational performance at the meso- and macroscopic level rather than on a microscopic-level approach. Since aggregating a whole month or a whole year of data could be significantly misleading regardless of the analysis method, this study also picked a specific day of the week, Tuesday, out of the weekdays. Figure 3 shows average vehicle observations by day of week across the two corridors; any detector-off observation (Event Code #81), after removing duplicates, was included in these counts. As Figure 3 shows, about 17% of weekly traffic was observed on Tuesdays, higher than on any other day of the week, such as Wednesday (~14%) or Monday (~16%). (Note that you can see the actual numbers by hovering the mouse, and zoom in or out by dragging on the screen.)
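The Tuesday-selection step amounts to counting detector-off (#81) events per weekday and taking each day's share. A minimal Python sketch with made-up events (the real counts come from the full-year detector stream):

```python
from collections import Counter
from datetime import datetime

def weekday_shares(events):
    """Share of detector-off (#81) observations by day of week."""
    counts = Counter(ev["time"].strftime("%A")
                     for ev in events if ev["event_code"] == 81)
    total = sum(counts.values())
    return {day: n / total for day, n in counts.items()}

# Hypothetical events: two Tuesday (2018-01-02) and one Wednesday (2018-01-03)
# detector-off records.
sample = [{"time": datetime(2018, 1, 2, 8), "event_code": 81},
          {"time": datetime(2018, 1, 2, 9), "event_code": 81},
          {"time": datetime(2018, 1, 3, 8), "event_code": 81}]
shares = weekday_shares(sample)
```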
Figure 3. Average observations by day of week
Furthermore, this study focuses on the 300-West corridor on every Tuesday over the 12 months, since the same analysis and approach can easily be applied to the other corridor later.
Simple methodologies were applied to both the signal and the detector data; they are briefly described in this section.
Signal timing information from Table 1 was used to obtain the signal allocation distribution. Based on UDOT's NEMA phase numbers, the signal time distribution was calculated by phase and by month. The focus here is on how much signal time was allocated to each phase at each intersection across the observations.
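The allocation computation reduces to summing time per phase and normalizing. A sketch (Python for illustration), assuming the per-phase green intervals have already been extracted from the signal phase events upstream:

```python
from collections import defaultdict
from datetime import datetime

def phase_shares(intervals):
    """intervals: iterable of (nema_phase, start_dt, end_dt) green intervals.
    Returns each phase's share of the total allocated signal time."""
    seconds = defaultdict(float)
    for phase, start, end in intervals:
        seconds[phase] += (end - start).total_seconds()
    total = sum(seconds.values())
    return {p: s / total for p, s in seconds.items()}

# Hypothetical intervals: phase 2 gets 60 s and phase 4 gets 20 s of one cycle.
t0 = datetime(2018, 1, 2, 8, 0, 0)
shares = phase_shares([(2, t0, t0.replace(minute=1)),
                       (4, t0.replace(minute=1), t0.replace(minute=1, second=20))])
```

Aggregating these shares by month, per intersection, yields the phase distribution snapshots shown later.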
For vehicle detector processing at each detection area, 1) the dwell time between each first 'detector-on' and last 'detector-off' event (Event Codes #82 and #81, respectively) was taken as a vehicle delay, and 2) each completed detector-on/detector-off pair was counted as one vehicle for the corresponding approach and lane. As mentioned above, some Event Codes in the detector recordings did not exist in the given detector lookup table, so a secondary filtering and cleaning step was unavoidable. Finally, processed vehicle movement data were obtained by lane, time of day, and direction.
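The pairing rule can be sketched as a single pass over one channel's time-ordered events (Python for illustration; timestamps are plain seconds here, and the simple on-then-off pairing assumes duplicates were already removed as described):

```python
ON, OFF = 82, 81  # detector-on / detector-off event codes

def pair_detections(events):
    """events: time-ordered (timestamp_seconds, event_code) for one channel.
    Returns (per-vehicle dwell times, vehicle count)."""
    delays, count, on_time = [], 0, None
    for t, code in events:
        if code == ON and on_time is None:
            on_time = t                  # first 'on' of this occupancy span
        elif code == OFF and on_time is not None:
            delays.append(t - on_time)   # dwell time taken as the vehicle delay
            on_time = None
            count += 1                   # one completed on/off pair = one vehicle
    return delays, count

delays, count = pair_detections([(0.0, 82), (2.5, 81), (5.0, 82), (6.0, 81)])
```

Unmatched events (an 'off' with no preceding 'on', or a trailing 'on') are simply skipped, which mirrors the secondary filter-out step described above.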
The series of figures below shows signal phase distributions for each intersection on every Tuesday from January to December 2018. The phase numbers follow UDOT's NEMA phase convention, and the distribution ratio for each phase was obtained from its monthly signal allocation frequency. According to these snapshots, the 700-East corridor has more consistent signal distributions for the major directions (northbound and southbound) overall, except for Signal ID #7076, whereas the 300-West corridor shows relatively diverse signal distributions. For example, the intersections on 300 West between Signal ID #7122 and Signal ID #7124 allocate almost all signal time to the major directions, while those from Signal ID #7125 southward (to Signal ID #7241) have diverse signal distributions because of other heavily used crossing corridors, such as North Temple and 400 South.
(Note: you can zoom in and out with the mouse and click to see detailed numbers for each intersection.)
Based on the detector observations, some informative delay patterns can be seen by time of day, month of year, and so on; selected findings are presented here. Figures 5a and 5b show the observed average delay for each intersection by time of day: Figure 5a shows northbound observations and Figure 5b southbound observations. Apart from the extremely high southbound values at #7123, three intersections, #7127, #7241, and #7126, show the highest delays in both directions. Overall, the afternoon peak (15:00-19:00) has higher average delays than the morning peak (06:00-08:00). Also, northbound has relatively higher average delays than southbound in the morning peak, while southbound is higher in the afternoon peak.
(Note: you can hover the mouse and drag to zoom the charts in and out.)
Using the traffic counts and delays for each intersection by movement and approach, a simple but interactive traffic count map is presented in Figure 6 for each month. Each circle also represents the average total delay, computed by multiplying the count by the average delay for each intersection (total delay = count × average delay). For example, in January, intersection #7122 has quite diverse traffic observations across the movements (left, through, and right), while traffic observations at intersections #7123, #7127, and #7129 are barely recorded and processed. The average total delay also shows that two intersections, #7125 and #7241, have the highest values among all intersections.
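The circle values follow directly from the definition above (total delay = count × average delay). A minimal sketch with hypothetical per-movement numbers:

```python
def total_delays(per_movement):
    """per_movement: {movement: (vehicle_count, avg_delay_seconds)}.
    Returns vehicle-seconds of total delay per movement."""
    return {m: c * d for m, (c, d) in per_movement.items()}

# Hypothetical January numbers for one intersection.
jan = total_delays({"left": (300, 25.0), "through": (1200, 18.0), "right": (450, 9.0)})
```

Summing the per-movement values gives the intersection-level figure plotted on the map.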
(Note: you can hover to check the actual average total delay for each intersection and click to see the observed traffic counts for each movement.)
In this study, a meso-/macroscopic view of performance measures is presented using the open-source language R, from raw data processing to visualizing the snapshots. Unlike traditional space-time diagrams or effective green time cumulative curves (Day et al., 2011; Kimble, 2017; Gault, 2018), this study focused on a bird's-eye, corridor-level view of performance measurement. Several performance measures, such as the signal allocation distribution, the vehicle delay distribution, and vehicle movements and delays by intersection and movement type, were investigated and presented. This study did not include pedestrian-related performance measures; however, capturing them should be straightforward with the same logic. Also, because the focus was on meso-/macro-level analysis, micro-level details were not addressed; additional performance measures from a micro-level analysis, such as stopped delay or movement delay by lane and movement for each intersection, would be a good direction for future work.
To conclude, the emerging detector technology behind automated traffic signal performance measures (ATSPMs) brings more detailed and broader operational benefits. Like the two sides of a coin, however, taking advantage of the high-resolution data makes sophisticated attention to understanding that data all the more crucial.
[1] Day, C.M., Brennan Jr, T.M., Hainen, A.M., Remias, S.M., Premachandra, H., Sturdevant, J.R., Richards, G., Wasson, J.S. and Bullock, D.M., 2011. Reliability, Flexibility, and Environmental Impact of Alternative Arterial Offset Optimization Objective Functions. Transportation Research Record.
[2] Brennan Jr, T.M., Day, C.M., Sturdevant, J.R. and Bullock, D.M., 2011. Visual Education Tools to Illustrate Coordinated System Operation. Transportation Research Record, 2259(1), pp.59-72.
[3] Sturdevant, J.R., Overman, T., Raamot, E., Deer, R., Miller, D., Bullock, D.M., Day, C.M., Brennan Jr, T.M., Li, H., Hainen, A. and Remias, S.M., 2012. Indiana traffic signal hi resolution data logger enumerations.
[4] Kimble, S., 2017. Leveraging Hi-Res Data for Signalized Corridor Monitoring. ITS Midwest Conference.
[5] FHWA. Automated Traffic Signal Performance Measures (ATSPMs). https://www.fhwa.dot.gov/innovation/everydaycounts/edc_4/atspm.cfm (accessed May 21, 2019).
[6] Gault, S., 2018. Automated Traffic Signal Performance Measures. Penn State Engineering and Safety Conference.